The Mixing method: coordinate descent for low-rank semidefinite programming
Authors
Po-Wei Wang, Wei-Cheng Chang, J. Zico Kolter
Abstract
In this paper, we propose a coordinate descent approach to low-rank structured semidefinite programming. The approach, which we call the Mixing method, is extremely simple to implement, has no free parameters, and typically attains an order of magnitude or better improvement in optimization performance over the current state of the art. We show that for certain problems, the method is strictly decreasing and guaranteed to converge to a critical point. We then apply the algorithm to three separate domains: solving the maximum cut semidefinite relaxation, solving a (novel) maximum satisfiability relaxation, and solving the GloVe word embedding optimization problem. In all settings, we demonstrate improvement over the existing state of the art along various dimensions. In total, this work substantially expands the scope and scale of problems that can be solved using semidefinite programming methods.
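The abstract does not spell out the update itself, so the sketch below is only a minimal illustration of cyclic coordinate descent on the low-rank (Burer-Monteiro) form of the MAXCUT SDP relaxation, min over V of <C, V^T V> subject to unit-norm columns of V, which is the problem class the paper targets. The rank choice, initialization, and fixed sweep count are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch (not the authors' code) of cyclic coordinate descent on the
# low-rank form of the MAXCUT SDP relaxation:
#     min_V <C, V^T V>   s.t. every column of V has unit norm.
import numpy as np

def mixing_style_coordinate_descent(C, k=None, n_sweeps=100, seed=0):
    n = C.shape[0]
    k = k or int(np.ceil(np.sqrt(2 * n)))        # common low-rank heuristic (assumed)
    rng = np.random.default_rng(seed)
    V = rng.standard_normal((k, n))
    V /= np.linalg.norm(V, axis=0)               # start with unit-norm columns
    for _ in range(n_sweeps):
        for i in range(n):                       # one cyclic sweep over the columns
            g = V @ C[:, i] - C[i, i] * V[:, i]  # sum_j C_ij v_j, excluding the diagonal term
            norm = np.linalg.norm(g)
            if norm > 0:
                V[:, i] = -g / norm              # exact minimizer over the unit sphere for column i
    return V.T @ V                               # low-rank PSD matrix X = V^T V
```

For MAXCUT itself, C would typically be the negated, scaled graph Laplacian (for example -L/4), so that minimizing <C, X> matches the usual maximization form of the relaxation; that choice is also an assumption here.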
Similar resources
A Coordinate Ascent Method for Solving Semidefinite Relaxations of Non-convex Quadratic Integer Programs
We present a coordinate ascent method for a class of semidefinite programming problems that arise in non-convex quadratic integer optimization. These semidefinite programs are characterized by a small total number of active constraints and by low-rank constraint matrices. We exploit this special structure by solving the dual problem, using a barrier method in combination with a coordinate-wise ...
Efficient Low-Rank Stochastic Gradient Descent Methods for Solving Semidefinite Programs
We propose a low-rank stochastic gradient descent (LR-SGD) method for solving a class of semidefinite programming (SDP) problems. LR-SGD has clear computational advantages over standard SGD methods, as its iterative projection step (an SDP problem) can be solved efficiently. Specifically, LR-SGD constructs a low-rank stochastic gradient and computes an optimal solution to the project...
Practical first order methods for large scale semidefinite programming
This paper investigates first order methods for solving large scale semidefinite programs. While interior point methods are (a) theoretically sound and (b) effective and robust in practice, they are only practical for small scale problems. As the dimension of the problem increases, both the space and time needed become prohibitive. We survey first order methods which have been proposed in the l...
A Convergent Gradient Descent Algorithm for Rank Minimization and Semidefinite Programming from Random Linear Measurements
We propose a simple, scalable, and fast gradient descent algorithm to optimize a nonconvex objective for the rank minimization problem and a closely related family of semidefinite programs. With O(rκn log n) random measurements of a positive semidefinite n×n matrix of rank r and condition number κ, our method is guaranteed to converge linearly to the global optimum.
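Only the guarantee is stated in this snippet; for reference, the sketch below shows generic factored gradient descent for recovering a rank-r PSD matrix from linear measurements b_i = <A_i, X>, the kind of nonconvex objective this line of work analyzes. The objective, random initialization, and fixed step size are illustrative assumptions rather than the paper's specific algorithm.

```python
# A generic sketch of factored (Burer-Monteiro style) gradient descent for
# recovering a rank-r PSD matrix from linear measurements b_i = <A_i, X>.
# Initialization and step size are illustrative assumptions.
import numpy as np

def factored_gd(A, b, r, step=1e-3, n_iter=500, seed=0):
    # A has shape (m, n, n): one symmetric measurement matrix A_i per entry of b.
    n = A.shape[1]
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((n, r)) / np.sqrt(n)  # small random initialization
    for _ in range(n_iter):
        X = U @ U.T
        resid = np.einsum('mij,ij->m', A, X) - b  # <A_i, U U^T> - b_i for every measurement
        G = np.einsum('m,mij->ij', resid, A)      # sum_i resid_i * A_i
        U -= step * (G + G.T) @ U                 # gradient of 0.5 * sum_i resid_i^2 w.r.t. U
    return U @ U.T
```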
Journal: CoRR
Volume: abs/1706.00476
Pages: -
Publication year: 2017